Search results for "Sliced inverse regression"

Showing 6 of 6 documents

A semiparametric approach to estimate reference curves for biophysical properties of the skin

2006

Reference curves, which take a covariate such as age into account, are often required in medicine, but simple, systematic, and efficient statistical methods for constructing them are lacking. Classical methods are based on parametric fitting (polynomial curves). In this chapter, we describe a new methodology for estimating reference curves from data sets, based on nonparametric estimation of conditional quantiles. The derived method should be applicable to all clinical, or more generally biological, variables that are measured on a continuous quantitative scale. To avoid the curse of dimensionality when the covariate is multidimensional, a new semiparametric approach is proposed. Th…
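The nonparametric conditional-quantile idea behind such reference curves can be sketched in a few lines. A minimal illustration, assuming a Gaussian kernel in the covariate and a fixed bandwidth; the function name and bandwidth choice are illustrative, not the chapter's estimator:

```python
import numpy as np

def conditional_quantile(x0, x, y, tau=0.5, h=0.1):
    """Kernel-weighted estimate of the tau-th conditional quantile of y
    given x = x0: observations are weighted by a Gaussian kernel in x,
    and the weighted empirical quantile of y is returned (a sketch, not
    the chapter's semiparametric method)."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)   # kernel weights in the covariate
    w /= w.sum()
    order = np.argsort(y)
    cum = np.cumsum(w[order])                # weighted empirical CDF of y
    return y[order][np.searchsorted(cum, tau)]
```

Evaluating this on a grid of covariate values for, say, tau = 0.05, 0.5, 0.95 traces out lower, median, and upper reference curves.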

Statistics::Theory; Kernel density estimation; Covariate; Sliced inverse regression; Applied mathematics; Statistics::Methodology; Semiparametric regression; Parametric statistics; Dimensionality reduction; Nonparametric statistics; Data mining; Quantile; Mathematics; Humanities and Social Sciences / Economics and Finance; Life Sciences [q-bio] / Public health and epidemiology; C140; C630

Some extensions of multivariate sliced inverse regression

2007

Multivariate sliced inverse regression (SIR) is a method for achieving dimension reduction in regression problems when the outcome variable y and the regressor x are both assumed to be multidimensional. In this paper, we extend the existing approaches, based on the usual SIR I, which uses only the inverse regression curve, to methods that use properties of the inverse conditional variance. Unlike the existing approaches, these new methods are not blind to symmetric dependencies, since they rely on SIR II or SIRα. We also propose the corresponding pooled slicing versions and illustrate the usefulness of these approaches in simulation studies.
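The basic SIR I construction that this paper extends can be sketched for a univariate y: whiten the regressor, slice the data on the order of y, and eigen-decompose the between-slice covariance of the slice means. A minimal illustration under these assumptions; the multivariate-y, SIR II, and SIRα versions discussed in the paper are not shown:

```python
import numpy as np

def sir_directions(x, y, n_slices=10, k=1):
    """Basic SIR I sketch (univariate y): the leading eigenvectors of the
    weighted covariance of slice means of the whitened regressor estimate
    the effective dimension reduction directions."""
    n, p = x.shape
    xc = x - x.mean(axis=0)
    cov = xc.T @ xc / n
    l_inv = np.linalg.inv(np.linalg.cholesky(cov))
    z = xc @ l_inv.T                          # whitened regressor
    m = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):
        mean_h = z[idx].mean(axis=0)          # inverse regression curve on slice h
        m += (len(idx) / n) * np.outer(mean_h, mean_h)
    _, vecs = np.linalg.eigh(m)
    b = l_inv.T @ vecs[:, ::-1][:, :k]        # back-transform to original scale
    return b / np.linalg.norm(b, axis=0)
```

Because SIR I uses only the slice means, it misses symmetric dependencies such as y = x₁²; that is the gap the SIR II and SIRα extensions address.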

Statistics and Probability; Multivariate statistics; Applied Mathematics; Dimensionality reduction; Inverse; Outcome variable; Modeling and Simulation; Statistics; Sliced inverse regression; Statistics::Methodology; Statistics, Probability and Uncertainty; Conditional variance; Regression problems; Regression curve; Mathematics; Journal of Statistical Computation and Simulation

A Statistical Matrix Representation Using Sliced Orthogonal Nonlinear Correlations for Pattern Recognition

2000

In pattern recognition, the choice of features to be detected is a critical factor in determining the success or failure of a method; much research has gone into finding the best features for particular tasks [1]. When images are captured by digital cameras, they are usually acquired as rectangular arrays of pixels, so the initial features are pixel values. Some methods use those pixel values directly for processing, for instance in normal matched filtering [2], whereas other methods perform some degree of pre-processing, such as binarizing the pixel values [3].
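The binarization pre-processing step mentioned above can be sketched in a couple of lines. A minimal illustration; the mean threshold is a hypothetical default, not the choice made in [3]:

```python
import numpy as np

def binarize(img, threshold=None):
    """Turn a grayscale pixel array into a binary one.
    The default mean threshold is an illustrative choice."""
    t = img.mean() if threshold is None else threshold
    return (img >= t).astype(np.uint8)
```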

Pixel; Degree (graph theory); Computer science; Covariance matrix; Matrix representation; Image processing and computer vision; Pattern recognition; Nonlinear system; Pattern recognition (psychology); Sliced inverse regression; Computer vision; Artificial intelligence; Representation (mathematics)

Asymptotic and bootstrap tests for subspace dimension

2022

Most linear dimension reduction methods proposed in the literature can be formulated using an appropriate pair of scatter matrices, see e.g. Ye and Weiss (2003), Tyler et al. (2009), Bura and Yang (2011), Liski et al. (2014) and Luo and Li (2016). The eigen-decomposition of one scatter matrix with respect to another is then often used to determine the dimension of the signal subspace and to separate signal and noise parts of the data. Three popular dimension reduction methods, namely principal component analysis (PCA), fourth order blind identification (FOBI) and sliced inverse regression (SIR) are considered in detail and the first two moments of subsets of the eigenvalues are used to test…
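The scatter-pair idea can be illustrated with the FOBI pair: eigen-decompose a fourth-moment scatter matrix with respect to the covariance matrix (here via whitening). For Gaussian data all eigenvalues are close to one, which is what subspace-dimension tests exploit. A rough sketch, not the paper's test statistic:

```python
import numpy as np

def fobi_eigenvalues(x):
    """Eigenvalues of the scaled fourth-moment scatter matrix of the
    whitened data (the FOBI pair). For a Gaussian sample the values
    concentrate around one; structured directions deviate from one."""
    n, p = x.shape
    xc = x - x.mean(axis=0)
    cov = xc.T @ xc / n
    z = xc @ np.linalg.inv(np.linalg.cholesky(cov)).T   # whitened data
    r2 = (z ** 2).sum(axis=1)                           # squared Mahalanobis norms
    cov4 = (z * r2[:, None]).T @ z / (n * (p + 2))      # scaled 4th-moment scatter
    return np.linalg.eigvalsh(cov4)[::-1]
```

Testing how far the trailing eigenvalues sit from one, using their first two moments, is the route to the asymptotic and bootstrap dimension tests studied in the paper.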

FOS: Computer and information sciences; FOS: Mathematics; Statistics and Probability; Mathematics - Statistics Theory; Statistics - Methodology; Principal component analysis; Dimension (vector space); Scatter matrix; Sliced inverse regression; Applied mathematics; Eigenvalues and eigenvectors; Numerical Analysis; Order determination; Dimensionality reduction; Independent component analysis; Multivariate methods; Estimation; Statistics, Probability and Uncertainty; Subspace topology; Signal subspace; Mathematics

Asymptotics for pooled marginal slicing estimator based on SIRα approach

2005

Pooled marginal slicing (PMS) is a semiparametric method, based on the sliced inverse regression (SIR) approach, for achieving dimension reduction in regression problems when the outcome variable y and the regressor x are both assumed to be multidimensional. In this paper, we consider the SIRα version (combining the SIR-I and SIR-II approaches) of the PMS estimator and establish the asymptotic distribution of the estimated matrix of interest. The asymptotic normality of the eigenprojector on the estimated effective dimension reduction (e.d.r.) space is then derived, as well as the asymptotic distributions of each estimated e.d.r. direction and its corresponding eigenvalue.
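The "matrix of interest" combining SIR-I and SIR-II can be sketched as a convex combination M_α = (1 − α)·M_I + α·M_II on whitened data. Note that several SIR-II variants exist in the literature; the one below (deviations of slice covariances from the identity) is an illustrative choice, not necessarily the paper's, and α = 0 reduces to plain SIR I:

```python
import numpy as np

def sir_alpha_matrix(x, y, alpha=0.5, n_slices=10):
    """Sketch of a SIR-alpha candidate matrix on whitened data:
    (1 - alpha) * M_I + alpha * M_II, where M_I uses slice means and
    M_II (one variant among several) uses deviations of slice
    covariances from the identity. Illustrative, not the PMS estimator."""
    n, p = x.shape
    xc = x - x.mean(axis=0)
    cov = xc.T @ xc / n
    z = xc @ np.linalg.inv(np.linalg.cholesky(cov)).T   # whitened regressor
    m1 = np.zeros((p, p))
    m2 = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):
        ph = len(idx) / n                 # slice weight
        mh = z[idx].mean(axis=0)          # slice mean (SIR-I ingredient)
        vh = np.cov(z[idx], rowvar=False) # slice covariance (SIR-II ingredient)
        m1 += ph * np.outer(mh, mh)
        d = vh - np.eye(p)
        m2 += ph * d @ d
    return (1.0 - alpha) * m1 + alpha * m2
```

The e.d.r. directions are then read off from the leading eigenvectors of this matrix, which is why the paper's asymptotics are stated for the estimated matrix, its eigenprojector, and the individual eigenpairs.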

Statistics and Probability; Numerical Analysis; Dimensionality reduction; Statistics; Sliced inverse regression; Asymptotic distribution; Estimator; Regression analysis; Statistics, Probability and Uncertainty; Marginal distribution; Effective dimension; Eigenvalues and eigenvectors; Mathematics; Journal of Multivariate Analysis

On the usage of joint diagonalization in multivariate statistics

2022

Scatter matrices generalize the covariance matrix and are useful in many multivariate data analysis methods, including well-known principal component analysis (PCA), which is based on the diagonalization of the covariance matrix. The simultaneous diagonalization of two or more scatter matrices goes beyond PCA and is used more and more often. In this paper, we offer an overview of many methods that are based on a joint diagonalization. These methods range from the unsupervised context with invariant coordinate selection and blind source separation, which includes independent component analysis, to the supervised context with discriminant analysis and sliced inverse regression. They also enco…
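The shared linear-algebra core of the methods surveyed, simultaneous diagonalization of two scatter matrices, can be written in a few lines: find B with B·S₁·Bᵀ = I and B·S₂·Bᵀ diagonal. A generic sketch of that transformation, not any particular paper's algorithm:

```python
import numpy as np

def joint_diagonalize(s1, s2):
    """Simultaneously diagonalize two symmetric scatter matrices (s1
    positive definite): returns (b, d) with b @ s1 @ b.T = I and
    b @ s2 @ b.T = diag(d), the transformation underlying ICS,
    FOBI-type ICA, discriminant analysis, and SIR."""
    vals1, vecs1 = np.linalg.eigh(s1)
    w = vecs1 / np.sqrt(vals1)                  # whitening: w.T @ s1 @ w = I
    vals2, vecs2 = np.linalg.eigh(w.T @ s2 @ w) # rotate within the whitened space
    b = (w @ vecs2).T
    return b, vals2
```

The surveyed methods differ only in which pair (S₁, S₂) they plug in: covariance and a fourth-moment scatter for FOBI/ICS, within- and between-class scatter for discriminant analysis, covariance and the slice-mean scatter for SIR.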

Statistics and Probability; Scatter matrices; Multivariate statistics; Blind source separation; Sliced inverse regression; Supervised dimension reduction; Economics and Finance; Mathematics; Numerical Analysis; Covariance matrix; Pattern recognition; Independent component analysis; Mathematical statistics; Multivariate methods; Linear discriminant analysis; Invariant component selection; Principal component analysis; Dimension reduction; Artificial intelligence; Statistics, Probability and Uncertainty